
    Interpreting gains and losses in conceptual tests using Item Response Theory

    Conceptual tests are widely used by physics instructors to assess students' conceptual understanding and compare teaching methods. It is common to look at how students' answers change between a pre-test and a post-test to quantify a transition in students' conceptions. This is often done by looking at the proportion of incorrect answers in the pre-test that change to correct answers in the post-test -- the gain -- and the proportion of correct answers that change to incorrect answers -- the loss. By comparing theoretical predictions to experimental data on the Force Concept Inventory, we show that Item Response Theory (IRT) predicts the observed gains and losses fairly well. We then use IRT to quantify students' changes in a test-retest situation when no learning occurs and show that i) up to 25% of all answers can change due to the non-deterministic nature of students' answers and that ii) gains and losses can range from 0% to 100%. Still using IRT, we highlight the conditions a test must satisfy in order to minimize gains and losses when no learning occurs. Finally, we propose recommendations on the interpretation of such pre/post-test progression with respect to the students' initial level.
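
    As an illustration of the mechanism described in this abstract, here is a minimal simulation sketch (not the paper's code), assuming a standard two-parameter logistic (2PL) IRT model; the item parameters and ability distribution are arbitrary choices. It shows how gains and losses arise in a test-retest setting with no learning, purely from the probabilistic nature of answers.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_correct(theta, a=1.5, b=0.0):
    """2PL item response function: probability of a correct answer."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Simulate a test-retest with NO learning: same abilities on both sittings.
n_students = 10_000
theta = rng.normal(0.0, 1.0, n_students)   # latent abilities (assumed normal)
p = p_correct(theta)

pre  = rng.random(n_students) < p          # pre-test answers (one item)
post = rng.random(n_students) < p          # post-test answers, same ability

# Gain: fraction of pre-test errors that become correct.
# Loss: fraction of pre-test successes that become incorrect.
gain = np.mean(post[~pre])
loss = np.mean(~post[pre])
changed = np.mean(pre != post)

print(f"gain = {gain:.2%}, loss = {loss:.2%}, answers changed = {changed:.2%}")
```

    Even though nothing was learned between the two sittings, a substantial fraction of answers flips, which is the effect the paper quantifies.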

    Can Dark Energy emerge from quantum effects in a compact extra dimension?

    The origin of the observed acceleration of the expansion of the universe is a major problem of modern cosmology and theoretical physics. Simple estimates of the contribution of vacuum to the energy density of the universe in quantum field theory are known to lead to catastrophically large values compared to observations. Such a contribution is therefore generally not regarded as a viable source for the acceleration of the expansion. In this letter we propose that the vacuum contribution actually provides a small positive value to the energy density of the universe. The underlying mechanism is a manifestation of the quantum nature of the gravitational field, through a Casimir-like effect from an additional compact dimension of space. A key ingredient is to assume that only modes with wavelength shorter than the Hubble length contribute to the vacuum. Such a contribution gives a positive energy density, has a Lorentz-invariant equation of state in the usual 4D spacetime and hence can be interpreted as a cosmological constant. Its value agrees with observations for a radius of the 5th extra dimension of 35 μm. This implies a modification of the gravitational inverse-square law around this scale, close to but below existing limits from experiments testing gravity at short range.
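
    As a rough plausibility check (not the paper's derivation), one can compare the Casimir-like scale ħc/R⁴ for R = 35 μm with the observed dark energy density; the required dimensionless prefactor then comes out at the few-percent level, the typical size of Casimir coefficients such as π²/720 ≈ 0.014.

```python
# Order-of-magnitude check (not the paper's computation): a Casimir-like
# vacuum energy density scales as rho ~ hbar*c / R^4 for a compact
# dimension of radius R. Compare with the observed dark energy density.

hbar_c = 3.16e-26      # J*m
R = 35e-6              # m, the radius quoted in the abstract

rho_casimir_scale = hbar_c / R**4   # J/m^3, up to a numerical prefactor
rho_lambda_obs = 5.9e-10            # J/m^3, roughly 0.7 * critical density

print(f"hbar*c/R^4         ~ {rho_casimir_scale:.1e} J/m^3")
print(f"observed rho_Lambda ~ {rho_lambda_obs:.1e} J/m^3")
print(f"required prefactor  ~ {rho_lambda_obs / rho_casimir_scale:.1e}")
```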

    Gravitational decoherence of atomic interferometers

    We study the decoherence of atomic interferometers due to the scattering of stochastic gravitational waves. We evaluate the 'direct' gravitational effect registered by the phase of the matter waves as well as the 'indirect' effect registered by the light waves used as beam-splitters and mirrors for the matter waves. Considering as an example the space project HYPER, we show that both effects are negligible for the presently studied interferometers.

    Gravitational waves, diffusion and decoherence

    The quite different behaviors exhibited by microscopic and macroscopic systems with respect to quantum interferences suggest that there may exist a natural frontier between the quantum and classical worlds. The value of the Planck mass (22 μg) leads to the idea of a connection between this borderline and intrinsic fluctuations of spacetime. We show that it is possible to obtain quantitative answers to these questions by studying the diffusion and decoherence mechanisms induced on quantum systems by gravitational waves generated at the galactic or cosmic scales. We prove that this universal fluctuating environment strongly affects quantum interferences on macroscopic systems, while leaving essentially untouched those on microscopic systems. We obtain the relevant parameters which, besides the ratio of the system's mass to the Planck mass, characterize the diffusion constant and decoherence time. We discuss the feasibility of experiments aiming at observing these effects in the context of ongoing progress towards more and more sensitive matter-wave interferometry.
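
    A purely illustrative numeric aside: if the decoherence rate grows with the squared ratio of the system's mass to the Planck mass (an assumption for this sketch, suggested by but not quoted from the abstract), the gap between an atom and a planetary body spans roughly ninety-five orders of magnitude, which is why the same gravitational-wave background is invisible to atoms yet dominant for the Moon's motion.

```python
# Illustrative only: if the decoherence rate scales as (m / m_Planck)^2,
# compare a single cesium atom with the Moon.

m_planck = 2.18e-8     # kg (~22 micrograms, as quoted in the abstract)
m_cs     = 2.21e-25    # kg, one cesium atom
m_moon   = 7.35e22     # kg

for name, m in [("Cs atom", m_cs), ("Moon", m_moon)]:
    print(f"{name:8s} (m/m_P)^2 = {(m / m_planck)**2:.1e}")
```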

    Performance Evaluation and Anomaly Detection in Mobile Broadband Across Europe

    With the rapidly growing smartphone market and users' expectation of immediate access to high-quality multimedia content, delivering video over wireless networks has become a major challenge, making it hard to provide end users with flawless quality of service. The growth of the smartphone market goes hand in hand with the development of the Internet, where current transport protocols are being re-evaluated to deal with traffic growth. QUIC and WebRTC are new and evolving standards. WebRTC was explicitly developed to meet this demand and enable a high-quality experience for mobile users of real-time communication services, while QUIC has been designed to reduce Web latency, integrate security features, and allow a high-quality experience for mobile users. Evaluating the performance of these rising protocols outside controlled testbeds is therefore essential to understand the behavior of real networks and provide end users with a better multimedia delivery service. Since most work in the research community is conducted in controlled environments, we leverage the MONROE platform to investigate the performance of QUIC and WebRTC in real cellular networks using static and mobile nodes.

    In this Thesis, we conduct measurements of WebRTC and QUIC and make the resulting data-sets public for interested experimenters. Building such data-sets is very welcome in the research community, as it opens the door to applying data science to network data. The development part of the experiments involves building Docker containers that act as QUIC and WebRTC clients; these containers are publicly available and can be used on their own or within the MONROE platform. These key contributions span Chapters 4 and 5, presented in Part II of the Thesis.

    We exploit the data collected with MONROE to apply data science to network data-sets, which helps identify networking problems and shifts the focus of the Thesis from performance evaluation to a data science problem. Indeed, the second part of the Thesis focuses on interpretable data science. Identifying network problems with Machine Learning (ML) has gained much visibility in the past few years, resulting in dramatically improved cellular network services. However, critical tasks like troubleshooting cellular networks are still performed manually by experts who monitor the network around the clock. In this context, this Thesis contributes by proposing the use of simple, interpretable ML algorithms, moving away from the current trend of high-accuracy ML algorithms (e.g., deep learning) that do not allow interpretation (and hence understanding) of their outcome. We accept lower accuracy, since the scenarios misclassified by the ML algorithms are precisely the ones we consider interesting (anomalous), and we do not want to miss them by overfitting. To this aim, we present CIAN (Causality Inference of Anomalies in Networks), a practical and interpretable ML methodology, which we implement in the form of a software tool named TTrees (Troubleshooting Trees) and compare to a supervised counterpart named STrees (Supervised Trees). Both methodologies require small volumes of data and are quick to train.
    Our experiments using real data from operational commercial mobile networks (e.g., sampled with MONROE probes) show that STrees and CIAN can automatically identify and accurately classify network anomalies (e.g., cases in which low network performance is not justified by the operational conditions) after training with just a few hundred data samples, hence enabling precise troubleshooting actions. Most importantly, our experiments show that a fully automated, unsupervised approach is viable and efficient. These contributions are presented in Part III of the Thesis, which includes Chapters 6 and 7.

    In conclusion, in this Thesis we go through a data-driven networking roller coaster, from evaluating the performance of upcoming network protocols in real mobile networks to building methodologies that help identify and classify the root causes of networking problems, emphasizing that these methodologies are easy to implement and can be deployed in production environments.

    This work has been supported by IMDEA Networks Institute.
    Programa de Doctorado en Multimedia y Comunicaciones por la Universidad Carlos III de Madrid y la Universidad Rey Juan Carlos.
    Committee: Matteo Sereno (President), Antonio de la Oliva Delgado (Secretary), Raquel Barco Moren (Member)
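
    A minimal sketch of the kind of interpretable model the Thesis advocates (not the actual TTrees/STrees code; the feature names and labeling rule are hypothetical): a shallow decision tree over a few network KPIs whose learned rules can be printed and audited directly by an operator.

```python
# A toy sketch in the spirit of STrees (not the thesis code): train a
# shallow, human-readable decision tree on synthetic network KPI samples.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 600

# Hypothetical features: RSRP (dBm), downlink throughput (Mbit/s), RTT (ms).
rsrp = rng.uniform(-120, -70, n)
tput = rng.uniform(0, 50, n)
rtt = rng.uniform(10, 300, n)
X = np.column_stack([rsrp, tput, rtt])

# Toy labeling rule: "anomalous" = low throughput despite good radio signal.
y = ((tput < 5) & (rsrp > -95)).astype(int)

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["rsrp_dbm", "tput_mbps", "rtt_ms"]))
```

    Limiting tree depth trades a little accuracy for rules a human can audit, which is exactly the interpretability-over-accuracy trade-off argued for above.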

    Introduction of interactive learning into French university physics classrooms

    We report on a project to introduce interactive learning strategies (ILS) into physics classes at the Université Pierre et Marie Curie (UPMC), one of the leading science universities in France. In Spring 2012, instructors in two large introductory classes -- first-year, second-semester mechanics and second-year introductory E&M, enrolling approximately 500 and 250 students respectively -- introduced ILS into some sections of each class. The specific ILS utilized were Think-Pair-Share questions and Peer Instruction in the main lecture classrooms, and UW Tutorials for Introductory Physics in recitation sections. Pre- and post-instruction assessments (FCI and CSEM respectively) were given, along with a series of demographic questions. We were able to compare the FCI and CSEM results between interactive and non-interactive classes taught simultaneously with the same curriculum. We also analyzed final exam results, as well as the results of student and instructor attitude surveys, between classes. In our analysis, we argue that Multiple Linear Regression modeling is superior to other common analysis tools, including normalized gain. Our results show that ILS are effective at improving student learning by all measures used: research-validated concept inventories and final exam scores, on both conceptual and traditional problem-solving questions. Multiple Linear Regression analysis reveals that interactivity in the classroom is a significant predictor of student learning, showing a similar or stronger relationship with student learning than ascribed characteristics such as parents' education and achieved characteristics such as GPA and hours studied per week. Analysis of student and instructor attitudes shows that both groups believe that ILS improve student learning in the physics classroom and increase student engagement and motivation.
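
    A minimal sketch of the regression approach argued for above, using synthetic data and hypothetical variable names (the study's actual covariates and coefficients are not reproduced here): the post-test score is regressed on the pre-test score, an interactivity indicator, and a covariate, so the interactivity effect comes with controls and a standard error, unlike normalized gain.

```python
# Sketch of a Multiple Linear Regression analysis of pre/post scores.
# All data below are synthetic and for illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
pre = rng.uniform(0, 30, n)              # e.g., FCI pre-test score
interactive = rng.integers(0, 2, n)      # 1 = section taught with ILS
gpa = rng.normal(3.0, 0.5, n)            # one example covariate

# Synthetic outcome: interactivity adds ~3 points by construction here.
post = 5 + 0.7 * pre + 3.0 * interactive + 1.5 * gpa + rng.normal(0, 3, n)

X = sm.add_constant(np.column_stack([pre, interactive, gpa]))
model = sm.OLS(post, X).fit()
print(model.summary(xname=["const", "pre", "interactive", "gpa"]))
```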

    HYPER and gravitational decoherence

    We study the decoherence process associated with the scattering of stochastic backgrounds of gravitational waves. We show that it has a negligible influence on HYPER-like atomic interferometers, although it may dominate the decoherence of macroscopic motions, such as the planetary motion of the Moon around the Earth.

    Dark sectors of the Universe: A Euclid survey approach

    In this paper we study the consequences of relaxing the hypothesis of the pressureless nature of the dark matter component when determining constraints on dark energy. To this aim we consider simple generalized dark matter models with a constant equation-of-state parameter. We find that present-day low-redshift probes (type-Ia supernovae and baryonic acoustic oscillations) lead to a complete degeneracy between the dark energy and dark matter sectors. However, adding the cosmic microwave background (CMB) high-redshift probe restores constraints similar to those on the standard ΛCDM model. We then examine the anticipated constraints from the galaxy clustering probe of the future Euclid survey on the same class of models, using a Fisher forecast estimation. We show that the Euclid survey allows us to break the degeneracy between the dark sectors, although the constraints on dark energy are much weaker than with standard dark matter. Combining with the CMB restores high precision on the dark energy sector constraints.
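
    A small numeric illustration of the low-redshift degeneracy described above (not the paper's analysis), assuming a flat universe with generalized dark matter of constant equation of state w_dm plus a cosmological constant: a nonzero w_dm can be almost perfectly absorbed by shifting Omega_dm as far as distances at z ≲ 1 are concerned, leaving the mismatch well below typical supernova/BAO precision.

```python
# Illustration of the SN/BAO degeneracy between w_dm and Omega_dm
# (toy model, not the paper's Fisher analysis).
import numpy as np
from scipy.integrate import quad
from scipy.optimize import minimize_scalar

def dist(z, omega_dm, w_dm):
    """Dimensionless comoving distance in a flat universe with
    generalized dark matter (constant w_dm) and a cosmological constant."""
    E = lambda zz: np.sqrt(omega_dm * (1 + zz) ** (3 * (1 + w_dm))
                           + (1 - omega_dm))
    return quad(lambda zz: 1.0 / E(zz), 0, z)[0]

zs = np.linspace(0.05, 1.0, 20)                      # SN/BAO redshift range
fid = np.array([dist(z, 0.30, 0.00) for z in zs])    # pressureless fiducial

def mismatch(omega_dm, w_dm=0.02):
    d = np.array([dist(z, omega_dm, w_dm) for z in zs])
    return np.max(np.abs(d / fid - 1))

best = minimize_scalar(mismatch, bounds=(0.2, 0.4), method="bounded")
print(f"w_dm = 0.02 mimicked by Omega_dm = {best.x:.3f}; "
      f"max distance mismatch = {best.fun:.2e}")
```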

    Quantum improvement of time transfer between remote clocks

    Exchanging light pulses to perform accurate space-time positioning is a paradigmatic issue of physics. It is ultimately limited by the quantum nature of light, which introduces fluctuations in the optical measurements and leads to the so-called Standard Quantum Limit (SQL). We propose a new scheme combining homodyne detection and mode-locked femtosecond lasers that leads to a new SQL in time transfer, potentially reaching the yoctosecond range (10^-21 to 10^-24 s). We prove that no other measurement strategy can lead to better sensitivity with shot-noise-limited light. We then demonstrate that this already very low SQL can be overcome using appropriately multimode squeezed light. Benefiting from the large number of photons used in the experiment and from the optimal choice of both the detection strategy and the quantum resource, the proposed scheme represents a significant potential improvement in space-time positioning.
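
    A rough numeric check of the quoted range (illustrative parameters only), assuming the standard shot-noise timing bound δt ≈ 1/(2√N Δω) for N detected photons and r.m.s. optical bandwidth Δω:

```python
# Shot-noise-limited timing bound dt ~ 1 / (2 * sqrt(N) * d_omega).
# Parameter values below are assumptions, not taken from the paper.
import math

d_omega = 1e14          # rad/s, femtosecond-laser bandwidth (assumed)
for N in (1e12, 1e18):  # plausible photon numbers over an averaging time
    dt = 1.0 / (2.0 * math.sqrt(N) * d_omega)
    print(f"N = {N:.0e}: dt ~ {dt:.1e} s")
```

    With these assumed numbers the bound spans roughly 5e-21 s to 5e-24 s, consistent with the yoctosecond range quoted in the abstract.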